
Can ephapticity contribute to brain complexity?

  • Gabriel Moreno Cunha,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Project administration, Software, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Laboratório de Simulação e Modelagem Neurodinâmica, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil

  • Gilberto Corso ,

    Contributed equally to this work with: Gilberto Corso, Matheus Phellipe Brasil de Sousa, Gustavo Zampier dos Santos Lima

    Roles Methodology, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Departamento de Biofísica e Farmacologia, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil

  • Matheus Phellipe Brasil de Sousa ,

    Contributed equally to this work with: Gilberto Corso, Matheus Phellipe Brasil de Sousa, Gustavo Zampier dos Santos Lima

    Roles Methodology, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliations Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Laboratório de Simulação e Modelagem Neurodinâmica, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil

  • Gustavo Zampier dos Santos Lima

    Contributed equally to this work with: Gilberto Corso, Matheus Phellipe Brasil de Sousa, Gustavo Zampier dos Santos Lima

    Roles Data curation, Formal analysis, Investigation, Methodology, Project administration, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    gustavo.zampier@ufrn.br

    Affiliations Departamento de Física Teórica e Experimental, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Escola de Ciências e Tecnologia, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Laboratório de Simulação e Modelagem Neurodinâmica, Universidade Federal do Rio Grande do Norte, Natal, RN, Brazil, Institut Camille Jordan, UMR 5208 CNRS, University Lyon 1, Villeurbanne, France

Abstract

The inquiry into the origin of brain complexity remains a pivotal question in neuroscience. While synaptic stimuli are acknowledged as significant, their efficacy often falls short in elucidating the extensive interconnections of the brain and nuanced levels of cognitive integration. Recent advances in neuroscience have brought into question the mechanisms underlying the generation of highly intricate dynamics, emergent patterns, and sophisticated oscillatory signals. Within this context, our study, in alignment with current research, advances the hypothesis that ephaptic communication, in addition to synaptic mediation, may emerge as a prime candidate for unraveling optimal brain complexity. Ephaptic communication, hitherto little studied, refers to direct electric-field interactions between adjacent neurons, without the mediation of traditional synapses (electrical or chemical). We propose that these electric field couplings may provide an additional layer of connectivity that facilitates the formation of complex patterns and emergent dynamics in the brain. In this investigation, we conducted a comparative analysis between two types of networks utilizing the Quadratic Integrate-and-Fire Ephaptic model (QIF-E): (I) a small-world synaptic network (ephaptic-off) and (II) a mixed composite network comprising a small-world synaptic network with the addition of an ephaptic network (ephaptic-on). Utilizing the Multiscale Entropy methodology, we conducted an in-depth analysis of the responses generated by both network configurations, with complexity assessed by integrating across all temporal scales. Our findings demonstrate that ephaptic coupling enhances complexity under specific topological conditions, considering variables such as time, spatial scales, and synaptic intensity. These results offer fresh insights into the dynamics of communication within the nervous system and underscore the fundamental role of ephapticity in regulating complex brain functions.

1 Introduction

The brain can be understood as a sophisticated system in which mental states arise from interactions that span multiple levels encompassing physical and functional aspects [1–3]. The human mind is an intricate phenomenon that develops beneath the structural complexity of the brain [4, 5]. However, the precise nature of the mind-brain connection remains elusive, and a full understanding has yet to be achieved. The structure of the brain spans multiple temporal and spatial dimensions, giving rise to sophisticated cellular and neuronal phenomena that collectively constitute the physical basis of cognition [6]. In the spatial dimension, the cerebral organization exhibits similar patterns at various resolutions in the distribution of cells throughout the brain [7, 8]. In the temporal dimension, for example, there are modules for short-term and long-term memory [9, 10]. The architecture of the brain is inextricably linked to its connectivity, both in terms of function and structure [11].

The flow of information between neurons through synaptic firing patterns has always been considered the fundamental basis of neuronal processes, encompassing essential functions such as memory and consciousness [12–14]. As we advance in our understanding of the brain, it is increasingly recognized that the complexity of neuronal communication goes beyond synaptic connectivity [14]. In addition to synapses, adjacent electrical fields, known as ephaptic fields, are emerging as protagonists in modulating neuronal architecture and influencing functional responses [4, 5, 15, 16]. Ephaptic communication refers to cases in which neighboring neurons establish electrical connections and modulate extracellular flow [17–19]. This subtle electrical interaction highlights the harmonious interconnection that goes beyond synapses and adds a new dimension to the understanding of communication in the brain. Ephapticity thus emerges as a new narrative of neuronal complexity, suggesting that brain communication may transcend the boundaries of known synapses [19, 20]. Due to the short range of the electric fields, the ephapticity generated by a neuron affects neighboring neurons [15, 16]. This phenomenon has also been observed in a study in which electrical inhibition was induced in rat cells [21]. Furthermore, ephaptic coupling has been identified as a crucial factor in governing synchronization and spike timing in neurons [17, 22, 23].

In a recent study, Pinotsis and Miller presented a compelling argument suggesting that memory formation in the brain is associated with ephaptic processes that intrinsically shape and control neuronal activity by establishing connections between brain areas [24]. Their study provided empirical evidence for ephaptic coupling between two cortical regions in vivo. The results strongly suggest that ephaptic coupling, driven by electric fields, plays a causal role in local neuronal activity. Interestingly, neuronal activity under the influence of ephaptic coupling showed greater variability and complexity at lower levels of information transmission. In another study, Hunt et al. provided convincing evidence that oscillating electromagnetic (EM) fields play a pivotal role in steering and unifying conscious cognition [25]. Their study suggests that EM fields are not just by-products of brain functions but that they trigger various crucial functions. There is a possibility that the brain’s local and global electromagnetic fields may actually serve as a central locus of consciousness [25].

Neuronal ephaptic communication, which is essential for neuronal function, increases complexity through direct electrical interactions. This phenomenon, often overlooked in neuronal models, emphasizes the need for a more comprehensive understanding of the intricate processes in the brain. With this in mind, in this work we explore the role of ephaptic communication in parallel with synaptic networks, using a small-world network to model brain complexity [3, 26–30]. To this end, we performed various simulations of network structures using a small-world synaptic topology in two cases: (I) synaptic network only (ephaptic-off); (II) synaptic and ephaptic network (ephaptic-on). For the ephaptic-off (synaptic) network, we varied the following topological parameters: the number of neurons (N), the rewiring probability (rp) of the small world, and the synaptic intensity (ω(k)) of the connections. Moreover, the coupling of the ephaptic network is all-to-all, with weights that depend on the distance between neurons. To quantify the complexity, we used the multiscale entropy integration, a Shannon-like entropy computed over several time scales. The multiscale entropy (MSE) is based on the work of Zhang and the method of Costa et al. [31–33]. Our results show not only that ephaptic communication can contribute to explaining the complexity of the brain, but also that the balance between synaptic and ephaptic processes is essential for maintaining brain functionality.

This paper is divided into four sections as follows: The “Material and methods” section shows how the QIF-E model incorporates ephaptic coupling through an ephaptic current term. We then present the network model, both for small-world ephaptic-off networks and for networks with ephaptic coupling. We also discuss the mathematical tool called Multiscale Entropy (MSE), which quantifies the complexity of the neuronal network. In the “Results” section, the numerical simulations and data analysis are presented. Finally, the Discussion section provides a new perspective for understanding the complexity of the brain by considering the balance between synaptic and ephaptic communication.

2 Materials and methods

2.1 Firing neuron model with ephaptic coupling

The quadratic integrate-and-fire model with ephapticity (QIF-E) [34, 35] is a simplified integrate-and-fire neuron model that describes spikes in neurons embedded in an electric field given by the LFP, referred to here as the ephaptic term (Eq (1)). In contrast to physiologically accurate but computationally expensive neuronal models, the QIF-E model generates a standard action potential-like pattern and ignores subtleties such as control variables. According to Cunha et al. [34], ephaptic communication can be simulated by the following QIF-E hybrid model: (1) where Vm is the membrane potential difference, and a and b are parameters related to the electrical properties of the neuron membrane, such as membrane resistance and capacitance. The parameter c is the ephaptic weight, which is based on the electrophysiological properties of the extracellular and membrane milieu [34, 35]. The current terms (Iephap and I0) are related to communication stimuli. Eq (1) describes the time evolution of the QIF-E neuron. The first term on the right side of Eq (1) provides the intrinsic membrane dynamics. The second term, c·Iephap(t), summarizes the ephaptic interactions between neural elements. Finally, the term I0(t) expresses the synaptic communication between neurons [34, 35].
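Since the rendered form of Eq (1) is not reproduced in this text, the following is only a structural sketch consistent with the description above, with the intrinsic quadratic term written as a generic placeholder F(Vm; a, b); the exact published expression is given in Cunha et al. [34, 35].

```latex
% Structural sketch of Eq (1), not the verbatim published form:
% intrinsic QIF dynamics F(V_m; a, b), an ephaptic drive weighted by c,
% and a synaptic input I_0(t).
\begin{equation}
  \frac{dV_m}{dt}
  = \underbrace{F(V_m;\,a,\,b)}_{\text{intrinsic membrane dynamics}}
  + \underbrace{c\,I_{\mathrm{ephap}}(t)}_{\text{ephaptic coupling}}
  + \underbrace{I_0(t)}_{\text{synaptic input}}
  \tag{1}
\end{equation}
```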

To implement the ephaptic coupling in the QIF-E model, we assume that all neurons are approximately spherical (soma) and do not consider the propagation effects of the spikes along the axon. Therefore, the ephaptic term in Eq (1) is estimated by assuming that the membrane behaves like an electrical circuit (see Fig 1(b)). Assuming that the membranes of neurons i and j can be modeled by this description, the ephaptic current in Eq (1) is given by: (2) where Cm is the membrane capacitance and Rext is the extracellular resistance. In the approach by Shifman & Lewis [19], which builds on the concepts of other works [17, 34, 36–38], Eq (2) can be rewritten as: (3) which corresponds to the ephaptic transmembrane current in neuron 1 due to the ephaptic field of neuron 2 [see Fig 1(a)]. Here, Rm is the membrane resistance and Rext is the resistance of the extracellular milieu [17, 34–37]. This model follows the electrical circuit approach (see Fig 1(b)).
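As a rough illustration of the circuit reasoning behind Eqs (2) and (3) — a hedged sketch using only the quantities named in the text, not the published expressions, which are given in Cunha et al. [34, 35] and Shifman & Lewis [19] — the membrane current of neuron 2 sets up an extracellular potential across Rext, and that potential drives a transmembrane current in neuron 1 through its RC membrane:

```latex
% Hedged sketch of the equivalent-circuit reasoning of Fig 1(b)
% (signs and exact factors omitted):
\begin{align*}
  V_{\mathrm{ext}} &\approx R_{\mathrm{ext}}\, I_{\mathrm{mem}}^{(2)}
     &&\text{(extracellular potential set up by neuron 2)}\\
  I_{\mathrm{ephap}} &\approx C_m \frac{dV_{\mathrm{ext}}}{dt}
     + \frac{V_{\mathrm{ext}}}{R_m}
     &&\text{(current injected into neuron 1 by } V_{\mathrm{ext}}\text{)}
\end{align*}
```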

Fig 1. Ephaptic coupling model.

(a) Representation of the ephaptic coupling mechanism between two neurons. The first neuron generates an electrical field in the extracellular space, which then influences the second neuron through this potential difference. This form of communication, where electrical field interactions between adjacent neurons occur, is referred to as ephaptic coupling. (b) The equivalent circuit of ephaptic coupling is simulated using the Quadratic Integrate-and-Fire (QIF-E) model. (c) When ephaptic coupling is turned off, the network is modeled using a small-world topology. This arrangement reflects a typical synaptic network structure. (d) In contrast, the brain network is mimicked by an ephaptic-on model, which incorporates distance-dependent interactions between neurons. The red line width indicates the distance between neurons in this model. (e) This panel shows the temporal series for all neurons in the ephaptic-off network, averaged spatially (see Eq (7)). (f) This panel shows the temporal series for all neurons in the ephaptic-on network, averaged spatially (see Eq (7)).

https://doi.org/10.1371/journal.pone.0310640.g001

In this way, the QIF-E equation for ephaptic coupling is obtained by substituting Eq (3) into Eq (1). The case of the N-neuron system is calculated by applying the principle of superposition of electric potentials. The general QIF-E equations for the N-neuron system are therefore as follows: (4) where Vm(i) is the voltage across the membrane of the i-th neuron. In addition, the term c(j) expresses the weight of the ephaptic coupling of the j-th neuron coupled to the i-th neuron. Note that the c(j) parameters have the same interpretation as the parameter c mentioned above; the additional sub-index indicates that each pair of neurons (i, j) corresponds to a different coupling constant. The value of c(j) changes from pair to pair because the Euclidean distances are different for each neuron pair; therefore, the ephaptic coupling constants are also different.
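Again as a structural sketch rather than the verbatim Eq (4), the superposition principle means that the ephaptic drives from all other neurons add linearly, each with its own pair-dependent weight c(j):

```latex
% Structural sketch of the N-neuron system (Eq (4)), with the intrinsic
% term kept as the placeholder F introduced above:
\begin{equation}
  \frac{dV_m^{(i)}}{dt}
  = F\!\left(V_m^{(i)};\,a_i,\,b_i\right)
  + \sum_{j \neq i} c(j)\, I_{\mathrm{ephap}}^{(i,j)}(t)
  + I_0^{(i)}(t),
  \qquad i = 1,\dots,N
  \tag{4}
\end{equation}
```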

In addition, the term I0(i)(t) corresponds to the synaptic input. In this work, the CUBA model was applied for synaptic modeling [39, 40], which shows an exponential decay (see [39, 40]): (5) where the synaptic intensity, ω(k), is non-zero in the presence of a synaptic connection between neurons i and k. Furthermore, tspike(k) indicates the time at which the presynaptic neuron (k) fired a spike. In addition, the parameters a and b are chosen in the ranges a ∈ [25 ± 1.25] and b ∈ [30 ± 1.5] to mimic the biological differences between neurons.
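A minimal sketch of a current-based (CUBA) exponential synapse of the kind referenced in [39, 40] is given below; the decay time constant τs is an assumed symbol, introduced here only because the rendered Eq (5) is not reproduced in this text.

```latex
% Hedged sketch of a CUBA exponential synapse (cf. [39, 40]): each spike
% of presynaptic neuron k at time t_spike^(k) injects a current of
% amplitude omega(k) that decays with an assumed time constant tau_s.
\begin{equation}
  I_0^{(i)}(t)
  = \sum_{k} \omega(k)\,
    \exp\!\left(-\frac{t - t_{\mathrm{spike}}^{(k)}}{\tau_s}\right)
    \Theta\!\left(t - t_{\mathrm{spike}}^{(k)}\right)
  \tag{5}
\end{equation}
```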

In the next section, we discuss the complex network used in the simulation and the network form of the QIF-E equation that implements the ephaptic coupling.

2.2 Complex network model

The synaptic connections between the neurons follow a small-world topology [41]. The small-world network ensures a short characteristic path length and a high clustering coefficient [3, 41]. This topological arrangement is often used to describe the propagation of information in the brain, mainly because of its low energetic cost [3, 26–30]. These two properties are commonly associated with brain structure [3, 26–30]. In our study, most simulations were performed with N = 100 neurons, with four first neighbors and a synaptic rewiring probability (rp) that varies from 0% (regular network) to 100% (random network) (Fig 1(c) and 1(d), blue). To mimic the ephaptic network, an “all-to-all” topology was used, where the weighting is determined by the distance between neurons. Indeed, the ephaptic current depends on the electric field, which decreases quadratically with distance [17, 19, 34–37]. Moreover, the electric fields can be summed due to the superposition principle (Fig 1(d)). In this work, the distances for half of the network were estimated with an initial distance of d = 50 μm multiplied by the factor |i − k|, where i and k are the indices of the neurons (Fig 1(b), red). For the other half of the network, the distances are obtained by the symmetry of the problem, using the values of the first half. In this way, the ephaptic weight decays with the squared distance, with a dimensional factor of 10−2 previously estimated [17, 34–36] using the physiological parameters described by Cunha et al. [34, 35].
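For concreteness, the following is a minimal Python sketch of the two coupling structures described above: a Watts-Strogatz small-world synaptic adjacency matrix and an all-to-all ephaptic weight matrix that decays with the squared inter-neuron distance (spacing of 50 μm per index step and an overall factor of 10−2, with assumed units). Function and variable names are illustrative and are not taken from the authors’ MATLAB code.

```python
import numpy as np
import networkx as nx


def build_networks(N=100, nb=4, rp=0.1, d0=50.0, dim_factor=1e-2):
    """Sketch of the two coupling structures described in the text.

    Returns the small-world synaptic adjacency matrix and an all-to-all
    ephaptic weight matrix decaying as 1/d^2 (d0 = spacing between index
    neighbours, assumed in micrometres).
    """
    # Small-world synaptic topology: nb nearest neighbours, rewiring
    # probability rp (Watts-Strogatz).
    g = nx.watts_strogatz_graph(N, nb, rp)
    syn_adj = nx.to_numpy_array(g)

    # All-to-all ephaptic weights: the index separation is taken on a ring,
    # which reproduces the "other half by symmetry" construction.
    idx = np.arange(N)
    sep = np.abs(idx[:, None] - idx[None, :])
    sep = np.minimum(sep, N - sep)
    dist = d0 * sep
    eph_w = np.zeros((N, N))
    off_diag = sep > 0
    eph_w[off_diag] = dim_factor / dist[off_diag] ** 2   # ~ 1/d^2 falloff
    return syn_adj, eph_w


syn_adj, eph_w = build_networks()
```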

The equation for a single neuron embedded in the network is therefore as follows: (6) where the reset condition is given by Vm(t) ≥ 90 mV ⇒ Vm(t + 1) = −5 mV (hyper-polarization condition) [34, 35, 42]. As for the numerical simulation, the time series was obtained from the average activity of all neurons in the network. To calculate each Vm(i) in Eq (6), we use a simulation time of 60 s (excluding the first 10 s as the transient phase), with a time step of 10−3 s. All simulations are performed in MATLAB using Euler integration with a step of 10−3 s (1 ms). In this way, the temporal series (Local Field Potential—LFP) is defined by the spatial average of all membrane potential differences in the network, i.e.: (7) xl(t) = (1/N) ∑i Vm(i)(t), where the sum runs over all N neurons.
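A compact Python sketch of the integration scheme just described (explicit Euler, dt = 1 ms, reset at Vm ≥ 90 mV to −5 mV, and the LFP as the spatial mean of Eq (7)) is shown below. The intrinsic term and the coupling currents are placeholders standing in for Eqs (4)–(5); this is not the authors’ code.

```python
import numpy as np


def simulate_lfp(N=100, t_total=60.0, t_transient=10.0, dt=1e-3,
                 v_thresh=90.0, v_reset=-5.0, seed=0):
    """Euler-integration sketch of the network dynamics (Eq (6)) and the
    LFP average (Eq (7)). The intrinsic and coupling terms are placeholders,
    not the published QIF-E expressions."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_total / dt)
    # Heterogeneous membrane parameters: a in 25 +/- 1.25, b in 30 +/- 1.5.
    a = 25.0 + 2.5 * (rng.random(N) - 0.5)
    b = 30.0 + 3.0 * (rng.random(N) - 0.5)
    v = np.zeros(N)
    lfp = np.empty(n_steps)

    def f_intrinsic(v):
        # Placeholder for the intrinsic QIF-E term of Eqs (1)/(4);
        # illustrative only, not the published expression.
        return a * (v / 100.0) ** 2 + b

    def coupling_currents(v, t):
        # Placeholder for the ephaptic (Eq (3)) and synaptic CUBA (Eq (5))
        # inputs; set to zero in this sketch.
        return np.zeros(N)

    for step in range(n_steps):
        dv = f_intrinsic(v) + coupling_currents(v, step * dt)
        v = v + dt * dv                              # explicit Euler, dt = 1 ms
        v = np.where(v >= v_thresh, v_reset, v)      # reset (hyper-polarization)
        lfp[step] = v.mean()                         # Eq (7): spatial mean (LFP)

    return lfp[int(t_transient / dt):]               # discard the 10 s transient


lfp = simulate_lfp(N=10, t_total=1.0, t_transient=0.2)  # small demo run
```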

The analysis presented in this paper refers to the time series xl(t), i.e., the average activity of the neuronal network (LFP). The QIF-E code is provided in the Supporting Information. The parameter values listed in Table 1 were used in our simulations.

2.3 Complexity and multiscale entropy

Entropy is a measure commonly associated with information and used to quantify complexity in systems [31, 33, 43–45]. If we consider X ≔ {x1, x2, …, xM} a time series of length M, the sample entropy (SE) is a robust and widely used tool to quantify the regularity in X [31, 32, 44, 45]. The template vectors xm of X are defined for different lengths m, i.e., xm ⊂ X, with xm = {xl, xl+1, …, xl+m} and 1 ≤ m < M; note that the template vector xm is a subset of X, whereas the scalar xm is the m-th term of X. The case m = 1 therefore recovers the original time series X, with xm = {xm}, 1 ≤ m ≤ M. The sample entropy is then defined by the following equation [32, 33, 45–47]: (8) SE = −ln[n(m + 1)/n(m)], where n(m) indicates the conditional probability that two templates xm match in m points. A match between templates is defined up to a tolerance r, which indicates the maximum Euclidean distance allowed between two templates (of length m) in the same time series [31, 32, 44, 45].

However, due to the random noise usually present in experimental data, estimating signal complexity in the presence of strong noise is problematic [31, 32, 45]. Indeed, an information-poor signal can have a high entropy value simply because of the randomness of the data. To mitigate this problem, the entropy is estimated on different time scales to evaluate how the information is preserved along the time series. This approach is known as multiscale entropy (MSE) [31, 32, 45].

To compute the MSE, a coarse-graining procedure is used that generates new time series by averaging consecutive, non-overlapping subsets of the original series [31, 32, 45]. Note that the subsets in the coarse-graining method are not the same as the templates in the SE calculation. The number of consecutive points averaged defines the scale factor τ, which generates a new time series with M/τ points: (9) yj(τ) = (1/τ) ∑i=(j−1)τ+1…jτ xi, with 1 ≤ j ≤ M/τ.

To obtain the MSE, the coarse-graining procedure is applied for different scale factors, and a sample entropy is estimated for each new time series y(τ).

From the complexity perspective, the entropy is evaluated across all time scales to estimate the information, variety, or randomness in the data [31–33, 45]. The system complexity, K, is thus defined by integrating the entropy along the time scales τ [33]: (10) K = ∫1τmax SE(τ) dτ, where τmax is the maximum τ value compatible with the data size [45]. In the present work, we use time series of length M = 50000, τ ∈ [2, 100], m = 2 and r = 0.15·SD, where SD is the standard deviation of the time series.
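The following is a self-contained Python sketch of the pipeline described in this section: coarse-graining (Eq (9)), sample entropy (Eq (8)) with m = 2 and tolerance 0.15·SD, and the complexity index K as the integral of the entropy over scales (Eq (10)). It is a plain reimplementation of the standard Costa et al. procedure, not the exact code used by the authors.

```python
import numpy as np


def coarse_grain(x, tau):
    """Eq (9): average consecutive, non-overlapping windows of length tau."""
    n = len(x) // tau
    return x[:n * tau].reshape(n, tau).mean(axis=1)


def sample_entropy(x, m=2, r=None):
    """Eq (8): SE = -ln(n(m+1)/n(m)), with tolerance r (0.15*SD if omitted)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()

    def count_matches(length):
        # Count template pairs whose maximum componentwise distance is <= r.
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    n_m, n_m1 = count_matches(m), count_matches(m + 1)
    return np.inf if n_m == 0 or n_m1 == 0 else -np.log(n_m1 / n_m)


def complexity_K(x, taus=range(2, 101), m=2, r_factor=0.15):
    """Eq (10): integrate SE(tau) over the scale axis (trapezoidal rule)."""
    r = r_factor * np.std(x)          # tolerance fixed from the original series
    se = np.array([sample_entropy(coarse_grain(x, t), m=m, r=r) for t in taus])
    taus = np.array(list(taus), dtype=float)
    return float(np.sum(0.5 * (se[1:] + se[:-1]) * np.diff(taus)))


# Usage sketch on a short surrogate signal (white noise stands in for the LFP);
# the full M = 50,000 series from the simulations takes much longer to process.
K = complexity_K(np.random.default_rng(0).standard_normal(2_000), taus=range(2, 21))
```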

3 Results

In the present section, we show the results of the simulations for both models: ephaptic-off and ephaptic-on. Using the MSE tool, we compute the complexity in small-world networks. Fig 2(a) shows the MSE for the ephaptic-off network (small world; rewiring probability rp = 10%) and the ephaptic-on network (synaptic part: small world, rp = 10%). We note that at short scales (up to 8 ms), the ephaptic-on network (Fig 2(a), in red) is more entropic than the ephaptic-off network (blue). On the other hand, for the intermediate range (from 8 ms to 40 ms), the ephaptic-off network exhibits greater MSE than the ephaptic-on system. At long scales (from 40 ms to 100 ms), the ephaptic-on network again becomes more entropic than the ephaptic-off network. In general, the complexity estimated by the MSE integral (area below the curves) is larger in ephaptic-on networks (in red) than in ephaptic-off networks (in blue), as observed in Fig 2(b) for different rewiring probabilities (rp) in the small-world topology. To confirm these results, the Wilcoxon rank sum test (* → p < 0.05, ** → p < 0.01, *** → p < 0.001) was performed comparing networks with the same rp values. However, it can be observed that rp = 80% shows no significant difference between the two cases.
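As an illustration of the statistical comparison reported here (a sketch with made-up arrays, not the study’s simulated complexity values), the rank sum test can be run with SciPy:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Hypothetical complexity samples for one rp value (placeholders only).
K_ephaptic_off = rng.normal(loc=50.0, scale=3.0, size=20)
K_ephaptic_on = rng.normal(loc=55.0, scale=3.0, size=20)

stat, p_value = ranksums(K_ephaptic_off, K_ephaptic_on)
print(f"Wilcoxon rank sum statistic = {stat:.2f}, p = {p_value:.3g}")
```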

Fig 2. MSE method and complexity measure for the two models: Ephaptic-off and ephaptic-on networks.

(a) Multiscale Entropy for the ephaptic-off network LFP (rp = 0.1 and ω(k) = 5, in blue) and the ephaptic-on network LFP (rp = 0.1 and ω(k) = 5, in red). Both models were simulated for N = 100. (b) Complexity (K) for different rewiring probabilities (rp) in the small-world ephaptic-off network. All these simulations were performed with N = 100. The colors are the same as in panel (a). Wilcoxon rank sum test was applied (* → p < 0.05, **→p < 0.01, *** → p < 0.001).

https://doi.org/10.1371/journal.pone.0310640.g002

Subsequently, topological simulations were performed for several numbers of neurons, denoted as N, to explore the behavior of the model as N increases. Fig 3(a) illustrates the complexity of the small-world ephaptic-off network (rp = 10%, shown in blue) and the ephaptic-on network (rp = 10%, shown in red). It is evident that the ephaptic-on network has greater complexity than the ephaptic-off network. Furthermore, a significant discrepancy between the average curves emerges from N = 100 onward and grows as neurons are added to the network. Both average complexity curves show a simultaneous decrease in slope for large N.

Fig 3. Complexity for ephaptic-off and ephaptic-on networks.

(a) Complexity for different numbers of neurons, with rp = 0.1 and ω(k) = 5 (ephaptic-off network, in blue; ephaptic-on network, in red). (b) Complexity for different synaptic intensities, with N = 100. The increase of ω(k) promotes a decrease in complexity (K). This behavior is common to both systems (ephaptic-off and -on). The complexity for weak synapses is amplified by the addition of the ephaptic process (ω(k) = 5 and ω(k) = 10). In contrast, for strong synapses, the ephaptic-on system shows a decrease compared with the analogous ephaptic-off system (ω(k) = 30). Wilcoxon rank sum test was applied (* → p < 0.05, ** → p < 0.01, *** → p < 0.001).

https://doi.org/10.1371/journal.pone.0310640.g003

To explore the behavior of complexity in response to variations in synaptic weights (ω(k)), in Fig 3(b) we investigate the effect of synaptic strength. Our results reveal that increasing synaptic strength reduces the complexity of the network, regardless of whether ephapticity is turned on. For lower synaptic intensities (ω(k) < 20), the complexity of the ephaptic-on network (red) is higher than that of the ephaptic-off network (blue). In contrast, at higher synaptic intensities (ω(k) > 20), the complexity of the ephaptic-on network (red) is lower than that of the ephaptic-off network (blue). The Wilcoxon rank sum test was applied to the complexity data; K is larger for the ephaptic-on network in most cases. Only for ω(k) = 15 is there no significant difference in network complexity (* → p < 0.05, ** → p < 0.01, *** → p < 0.001). It is notable that the complexity is greater for ephaptic-on structures at ω(k) = 10. However, for high synaptic strength, ω(k) = 30, the ephaptic-off network presents higher complexity than the combined network (ephaptic-on).

Finally, Fig 4 shows the complexity results for different numbers of neighbors (nb) and synaptic intensities, in networks with N = 200. Fig 4(a) shows the complexity for ω(k) = 5. In this weak-synapse regime, the number of neighbors has an inverse impact on the complexity of the ephaptic-off network; in other words, increasing the number of neighbors decreases the complexity of the dynamics in the network. Furthermore, ephaptic coupling increases the complexity significantly for rp ≤ 30% (small-world regime), and only for nb = 4. For nb = 12, the results do not change significantly between ephaptic-off and ephaptic-on connections. In Fig 4(b), the complexity for strong synaptic connections is shown. As we observed, for the strongest synaptic connection, turning on ephapticity (red) leads to a lower system complexity than without ephapticity (blue), regardless of the number of neighbors (nb) and the rewiring probability (rp). Also, a significant difference between the networks appears for almost all rp and nb (taking statistical fluctuations into account). Note that increasing the number of neighbors decreases complexity (see also Supporting information); in particular, nb = 12 exhibits less complexity than nb = 4. The Wilcoxon rank sum test was applied to the complexity data (* → p < 0.05, ** → p < 0.01, *** → p < 0.001).

Fig 4. Complexity for N = 200, with different neighborhoods.

(a) The mean complexity for networks with ω(k) = 5. Note that the number of neighbors impacts the complexity; for nb = 4, the difference between ephaptic-off and ephaptic-on networks is larger than for nb = 12. Furthermore, for rp < 50% the complexity is smaller compared to rp > 50%. (b) The same comparison as in (a), for ω(k) = 30. The complexity values for ω(k) = 30 are lower than those for ω(k) = 5 (panel (a)), and the contrast between neighborhoods is more pronounced than in panel (a). Moreover, the ephaptic-off networks exhibit larger complexity than the ephaptic-on networks, see Fig 3(b). Wilcoxon rank sum test was applied (* → p < 0.05, ** → p < 0.01, *** → p < 0.001).

https://doi.org/10.1371/journal.pone.0310640.g004

4 Discussion

In this study, using the QIF-E hybrid model [34], we analyzed the effect of ephaptic coupling on neural complexity in synaptic (ephaptic-off) networks. We used the average time series (LFP) of the network in question to calculate the MSE on scales varying from 2 to 100 ms [31, 32]. In order to model the ephaptic-off network, we chose a small-world topology, recognized as the most suitable network configuration to represent complex neuronal functioning on multiple time scales [3, 26–30, 41]. The ephaptic-on model was then incorporated into the network [see Fig 1]. The results revealed a significant enhancement in neuronal complexity when ephaptic coupling is incorporated as an underlying mode of communication in the synaptic nervous system [17, 34, 36, 47, 48]. These findings indicate that ephaptic interactions could be crucial in fine-tuning the complexities of the neural network, providing valuable perspectives on intricate dynamics across multiple levels. Our findings are in harmony with recent empirical research [25, 49–52].

The results depicted in Fig 2(a) reveal that within intervals shorter than 10 ms (short range) and longer than 20 ms (long range), the average entropy of the network rises significantly when ephaptic coupling (red) is introduced into the ephaptic-off (blue) network. Courtiol et al. (2016) [53] demonstrated that this form of neuronal communication enhances complexity across both high and low frequencies, which is consistent with our findings. Conversely, the ephaptic-off (blue) network exhibits, on average, a slightly higher entropy value only at medium time scales, within a small range from 10 ms to 15 ms, compared to the ephaptic-on network. Consequently, oscillations in intermediate bands of the spectrum, between the high β (13–30 Hz) and low γ (45–100 Hz) bands, are related to a possible reduction in complexity when the ephaptic process is turned on [53]. The scales in the MSE in Fig 2(a) are associated with different neural oscillations, according to Courtiol et al. [53]. These results suggest that various modes of communication can take on specific roles depending on the specific neuronal task at hand, thus exerting an influence on the oscillation patterns observed in the neural signal [54–58].

When evaluating the complexity of the neural network for N = 100, Fig 2(b) shows that topologies associated with different synaptic rewiring rates (rp) have no significant influence on the average complexity of the excitatory neuron networks. The observed variation remains within the intrinsic statistical fluctuation of the random rewiring process of the ephaptic-off network. In conclusion, the results in Fig 2(b) suggest that, on average, network complexity is greater when neuronal communication involves ephaptic coupling, regardless of the rewiring probability of the ephaptic-off network, for the case N = 100.

The complexity of both network regimes increases with the addition of new neurons, as evidenced in Fig 3(a). An important observation is the marked difference between the two regimes (curves), especially for N > 100, always showing a greater degree of complexity when ephapticity is turned on. Indeed, this discrepancy amplifies with the escalation in neuron number, suggesting a potential contribution of ephaptic coupling to large-scale brain complexity (N > 200). The idea that small-world network models become more complex as the number of neurons increases is driven by various interrelated factors. Even in an ephaptic-off (synaptic) regime, studies such as Song et al. (2005) [59] and Bullmore and Sporns (2012) [7] highlight the importance of neuronal interconnections in forming complex networks. As the number of neurons increases, there is an expansion in synaptic (as well as ephaptic) connection opportunities, leading to a denser and more complex network of neuronal communication. This enables greater integration of information, as discussed by Tononi et al. (1994) [60] and Sporns et al. (2004) [61], contributing to the representation and processing of more complex information in the brain.

The increase in the number of neurons can lead to the emergence of more intricate patterns of neuronal activity, as described by Buzsáki and Draguhn (2004) [62], adding complexity (plasticity) to the system. In short, the complexity of the brain may be a consequence of the increase in the number of neurons, reflecting the intricate and highly organized nature of the nervous system. This complexity is driven by expanded neuronal interconnection, enhanced information integration, the emergence of complex activity patterns, functional diversity, and neuronal plasticity, all of which combine to create a complex, dynamic neural system. The small-world neuronal model may be under the same process.

Contrary to what was observed in Fig 3(a), Fig 3(b) exhibits a decrease in complexity as the intensity of the synaptic interaction increases. We defined four synaptic strengths to simulate the complexity, ranging from the weakest synaptic strength ω(k) = 5 to a robust synaptic strength of ω(k) = 30, for N = 100. The results reveal an inversely proportional correlation between synaptic intensity (ω(k)) and complexity in both analyzed network regimes. Furthermore, in the ephaptic-on regime, the complexity significantly exceeds that of the ephaptic-off regime, particularly for weak synaptic connections. This observation suggests a potential strategy to modulate the balance between synaptic and ephaptic communication and perhaps enhance the efficiency of a stable-state (healthy) nervous system. This conclusion resonates with the one drawn from Fig 2(a) regarding the various oscillatory modes prevalent in the network signal and their associated maximum-complexity timescales. Therefore, efficiency and ephaptic coupling can coexist harmoniously with synaptic communication to achieve an optimized state of brain complexity with the lowest possible energy expenditure in information transmission. Our results also highlight an even more accentuated reduction in complexity for the ephaptic-on regime compared to the ephaptic-off regime when the synapses are strong (ω(k) = 30). The same results were observed for a mixed excitatory and inhibitory neuronal network, as seen in the Supporting information.

To provide further context for the outcomes depicted in Fig 3(b), an intriguing study by Silva Ferré et al. [63] employed the concept of recurrence entropy via microstates [64] as a metric for complexity [65]. They demonstrated a decrease in entropy during the eyes-closed condition, indicating that intrinsic neural activity within the thalamocortical circuit predominates, characterized by the alpha rhythm oscillating between 8 and 12 Hz. Conversely, in the eyes-open condition, neural responses in the occipital region, responsible for visual processing, lead to desynchronized neuronal activity due to light detection on the retina. In another study, Madadi Asl et al. [66] elucidated spike-timing-dependent plasticity as a fundamental neural mechanism, altering synaptic strengths based on the temporal alignment of pre- and postsynaptic spikes [67].

In Parkinson’s disease, for instance, dopamine depletion can induce dysfunction reliant on synaptic connectivity, fostering pathological states characterized by tightly synchronized activity in closely connected neurons, indicative of diminished plasticity and complexity [68–70]. Additionally, computational investigations have delved into the relation between synchronization and desynchronization dynamics of neurons and their firing patterns, contingent upon factors such as network topology, neuronal population size, and coupling strength [71]. Furthermore, another computational study has demonstrated that spontaneous firing patterns are influenced by the topological arrangement and various synaptic structures [72]. Nobukawa et al. [72] highlight how network complexity changes through the application of the Multiscale Entropy approach, thereby reinforcing our analyses. It is noteworthy that these studies focus exclusively on regimes involving synaptic interactions, without considering ephaptic communication.

In order to elucidate the effects of neuronal complexity and relate them to the causality of ephapticity, we show that our findings corroborate existing empirical studies that highlight the importance of ephapticity in the organization and complexity of the brain. It is now well established that neural activity, manifested as waves or spikes, traditionally propagates through mechanisms such as synaptic transmission, gap junctions, or diffusion. However, the paper by Qiu et al. (2015) [73] elucidated an alternative explanation for experimental data, suggesting that neural signals may propagate via an electric field mechanism, known as ephaptic effects, to mediate the propagation of self-regenerating neural waves. This novel mechanism, involving cell-by-volume conduction, could potentially play a role in various types of propagating neural signals, including slow-wave sleep, sharp hippocampal waves, theta waves, or seizures. Another study demonstrated that ephaptic interaction alone can shape circuit function, inducing lateral inhibition of neurons [16]. This interaction influences spike timing, facilitating the development of intricate neural codes in higher processing centers. Moreover, the authors of that study showed that neurons with large spikes can exert greater ephaptic influences on their neighbors and may display ephaptic asymmetry, potentially stemming from an unequal number of reciprocal synapses, which may lead to a preference for the ephaptic pathway.

Hence, while specific cerebral mechanisms may differ, it is plausible that ephaptic dynamics regulate and modulate certain aspects of brain information, especially relative to synaptic processes during the advanced stages of neurodegenerative diseases [35]. Such an abnormal scenario intensifies the reduction in complexity evident in Fig 3(b) and in S1 Fig (networks with 20% inhibitory neurons; see Supporting information), which may be attributable to the inflexibility (rigidity) induced by strong synaptic connections. Loss of physiological functionality, a hallmark of the degenerative process, could be the cause of the ephaptic asymmetry and synaptic rigidity, affecting a specific subset of neurons in a specific region of the brain.

The intricate interplay between synaptic and ephaptic processes, more specifically in neurodegenerative contexts, underscores the need for a nuanced understanding of how disruptions in these mechanisms contribute to the observed alterations in brain dynamics. A recent study by Sousa et al. (2024), for instance, adapted the Hodgkin-Huxley (HH) model to examine the effects of ephaptic entrainment under thermal changes (HH-E). Focusing on subthreshold and suprathreshold neuronal regimes, it revealed the susceptibility of ephaptic entrainment to temperature variations, highlighting the relevance of ephaptic communication in neural processes influenced by temperature, such as inflammation, fever and epilepsy [74]. These insights are vital for unraveling the complexity associated with neurodegeneration, and may pave the way for targeted interventions aimed at mitigating the impact of these diseases on neural networks. As we delve deeper into the intricate dynamics of synaptic and ephaptic interactions, potential avenues for therapeutic interventions are opened, offering hope for improved management and treatment strategies for individuals affected by neurodegenerative conditions.

Fig 4 highlights the small-world characteristics present in neuronal data [3, 26–30]. In Fig 4(a), for nb = 4, it is notable that the complexity undergoes a transition at rp ≈ 50%. According to the Watts-Strogatz approach, networks with rp less than 50% predominantly exhibit small-world characteristics [41]. However, networks with rp greater than 50% demonstrate characteristics of random networks. Thus, the topological structure may influence the outcomes of our study, suggesting that communication balance depends on neuronal topological properties, which is supported by other studies [3, 26–30].

In Fig 4(b), it is observed that strong synaptic intensity decreases the network complexity, potentially fostering a synchronized environment among neurons in the network. Furthermore, the figure indicates that, regardless of the rewiring probability, at increased synaptic intensity the ephapticity further reduces the system’s complexity. These results are consistent with those in Fig 3(b), suggesting that ephaptic communication exhibits causal dynamic behavior and plays a modulatory role within the synaptic neural network. These findings suggest that ephaptic coupling may represent a universal biological adaptive mechanism to optimize, together with the synapses, the organization of neural communication.

Lastly, the relationship between complexity and the number of first synaptic neighbors shown in Fig 4(a) and 4(b) provides insight into the interplay between synaptic and ephaptic communication. The decrease in complexity for nb = 12 is observed in both sets of synaptic strength results. Thus, the findings in Fig 4 underscore the balance between ephapticity and neural synapses in the structure of brain communication, proposing a modulatory role of ephapticity in the central nervous system. Concerning the effect of ephapticity in a strong-synapse regime, these outcomes are in agreement with the study by Han and co-workers (2018) [22], which showed how ephaptic coupling promotes synchronous firing of cerebellar Purkinje cells.

These studies complement ongoing research and may shed light on the neurophysiological processes underlying brain complexity and their relationship with functional connectivity [51]. The heightened complexity indicates enhanced organization and more frequent transitions between diverse integrative states within brain networks, as evidenced by Billings (2018) [50]. This prompts the hypothesis that ephaptic coupling in vivo might play a pivotal role in memory formation and consolidation, as demonstrated by Pinotsis (2023) [24, 75].

5 Conclusion

Our research using the QIF-E model revealed an interesting finding: the complexity curve generated by activating ephapticity seems to align with that produced solely through synaptic communication. This alignment suggests that synaptic processes may play a guiding role in shaping the complexity of networks. This correlation was consistent across analyses including factors such as network size, strength of synaptic connections, number of connected neighbors, and probability of synaptic rewiring.

Although ephaptic communication is significantly weaker (about 1000 times less intense) than synaptic communication, its activation increases system complexity by 7–13%, depending on the state/regime observed. Therefore, our findings indicate that ephapticity is not merely insignificant background noise caused by synaptic activities in the network. Ephapticity exerts a significant influence on neuronal communication and the brain’s energetic cost: with the same amount of energy as a pure synaptic network, it improves the organization of neural connections and increases the complexity achieved with the same number of synaptic connections.

Our research suggests that ephapticity is a contributing factor to the enhancement of complexity in certain neural network scenarios. Incorporating ephaptic communication into models can yield a better understanding of how brain structure organizes communication. This communication may be associated with the interaction between nerve cells, their various states, and the effects associated with cognitive and memory processes. Moreover, our findings provide new insights into the implications of ephaptic communication in neurodegenerative disorders that affect plasticity, generating an imbalance in the dynamics of central nervous system communication. It is possible that synaptic communication alone is not sufficient to convey the full picture of brain complexity, underscoring the potentially significant role of ephaptic communication.

The QIF-E hybrid neuronal network model offers a good perspective on analyzing brain complexity by integrating both synaptic and ephaptic communication mechanisms. One of its strengths lies in capturing the intricate interplay between these two forms of neural communication, which are increasingly recognized as crucial in understanding brain function. By incorporating ephaptic effects alongside traditional synaptic transmission, such models can better simulate real-world neural dynamics, potentially revealing emergent properties and behaviors not observable in purely synaptic models. This hybrid approach enables researchers to explore how ephaptic interactions modulate neural activity and contribute to overall brain complexity.

However, this approach also has its limitations. One major challenge is accurately modeling the biophysical properties of ephaptic coupling, as these interactions are less well understood than synaptic transmission. Additionally, incorporating ephaptic effects into network simulations increases computational costs, potentially requiring significant computational resources and time. As a result, researchers may need to make trade-offs between model complexity and computational feasibility, limiting the size of the network and the duration of simulations. Furthermore, determining the appropriate size and scale of the network is crucial but challenging. Scaling up the network to represent the entire brain (say, N ≫ 1000) introduces computational challenges and may require simplifications or approximations that limit model accuracy. Conversely, modeling small-scale networks (say, N < 100) may overlook emergent properties and complex interactions that are only manifested at larger scales. Balancing these considerations while maintaining biological plausibility and computational efficiency poses a significant obstacle to hybrid neuronal network modeling.

In conclusion, QIF-E hybrid neuronal network models offer valuable insights into brain complexity by integrating synaptic and ephaptic communication mechanisms. While these models provide a powerful tool for studying neural dynamics, they also face challenges related to accurately representing ephaptic interactions, computational complexity, and determining the appropriate scale of network simulations. Addressing these limitations will be crucial for advancing our understanding of brain function using hybrid neuronal network models.

Supporting information

S1 Fig. The complexity (K) of a mixed synaptic network with 80% excitatory and 20% inhibitory neurons (ne = n*80% and ni = n*20%) is analyzed.

The figure shows the complexity for synaptic intensities of |ω(k)| = 5 and |ω(k)| = 30, comparing the scenarios with ephaptic coupling off (blue) and on (red). Notably, the complexity results are analogous to those observed in Fig 3(b) of the main text. These results indicate that increasing synaptic interaction leads to an inversion of the complexity values obtained with ephaptic coupling on versus off. The similarity between the results of purely excitatory and excitatory-inhibitory networks suggests that complexity may exhibit a universal characteristic. This implies that ephaptic coupling enhances communication efficiency in a physiological scenario, regardless of the network’s specific characteristics, connectivity, topology, and scale.

https://doi.org/10.1371/journal.pone.0310640.s001

(TIF)

S2 Fig. Complexity for all neighborhoods and rp values performed, with ω(k) = 5 and N = 100.

(a) Complexity (K) for different numbers of neighbors (nb) and different rewiring probability (rp) values, for ephaptic-off small-world networks. Note that for rp ≈ 10% and nb = 4, the small-world features promote the highest complexity. In contrast, for nb = 20 the highest complexity values appear in random network structures. (b) Complexity (K) for different numbers of neighbors and different rp values, for ephaptic-on networks. Observe that the complexity values are higher than in panel (a) for low nb values. However, increasing nb promotes a more accentuated decrease in comparison with (a). Therefore, the small-world advantage occurs at small nb.

https://doi.org/10.1371/journal.pone.0310640.s002

(TIF)

S3 Fig. Complexity for all neighborhoods and rp values performed, with ω(k) = 30 and N = 100.

(a) Complexity (K) for different numbers of neighbors (nb) and different rewiring probability (rp) values, for ephaptic-off small-world networks. The complexity for strong synapses is lower in comparison with S2(a) Fig, as shown by Figs 3(b) and 4. (b) Complexity (K) for different numbers of neighbors (nb) and different rewiring probability (rp) values, for ephaptic-on networks. The complexity values for the combined networks are lower in comparison with (a). These results are in line with Figs 3(b) and 4.

https://doi.org/10.1371/journal.pone.0310640.s003

(TIF)

References

  1. Nunez PL. Brain, mind, and the structure of reality. Oxford University Press; 2012.
  2. Bullmore E, et al. Generic aspects of complexity in brain imaging data and other biological systems. Neuroimage. 2009;47(3):1125–1134. pmid:19460447
  3. Bassett DS, Gazzaniga MS. Understanding complexity in the human brain. Trends in cognitive sciences. 2011;15(5):200–209. pmid:21497128
  4. Hagmann P, et al. Mapping the structural core of human cerebral cortex. PLoS biology. 2008;6(7):e159. pmid:18597554
  5. Bassett DS, Brown JA, Deshpande V, Carlson JM, Grafton ST. Conserved and variable architecture of human white matter connectivity. Neuroimage. 2011;54(2):1262–1279. pmid:20850551
  6. He BJ, Zempel JM, Snyder AZ, Raichle ME. The temporal structures and functional significance of scale-free brain activity. Neuron. 2010;66(3):353–369. pmid:20471349
  7. Bullmore E, Sporns O. The economy of brain network organization. Nature reviews neuroscience. 2012;13(5):336–349. pmid:22498897
  8. Moser EI, Kropff E, Moser MB. Place cells, grid cells, and the brain’s spatial representation system. Annu Rev Neurosci. 2008;31:69–89. pmid:18284371
  9. Shallice T, Burgess P. The domain of supervisory processes and temporal organization of behaviour. Philosophical Transactions of the Royal Society of London Series B: Biological Sciences. 1996;351(1346):1405–1412. pmid:8941952
  10. Dos Santos Lima GZ, et al. Mouse activity across time scales: fractal scenarios. Plos one. 2014;9(10):e105092.
  11. Petersen SE, Sporns O. Brain networks and cognitive architectures. Neuron. 2015;88(1):207–219. pmid:26447582
  12. Hobson JA, Pace-Schott EF. The cognitive neuroscience of sleep: neuronal systems, consciousness and learning. Nature Reviews Neuroscience. 2002;3(9):679–693. pmid:12209117
  13. Shepherd GM. The synaptic organization of the brain. Oxford University Press; 2003.
  14. Queenan BN, Ryan TJ, Gazzaniga MS, Gallistel CR. On the research of time past: the hunt for the substrate of memory. Annals of the New York Academy of Sciences. 2017;1396(1):108–125. pmid:28548457
  15. Su CY, Menuz K, Reisert J, Carlson JR. Non-synaptic inhibition between grouped neurons in an olfactory circuit. Nature. 2012;492(7427):66. pmid:23172146
  16. Zhang Y, et al. Asymmetric ephaptic inhibition between compartmentalized olfactory receptor neurons. Nature communications. 2019;10(1):1–16. pmid:30952860
  17. Anastassiou CA, Perin R, Markram H, Koch C. Ephaptic coupling of cortical neurons. Nature neuroscience. 2011;14(2):217–223. pmid:21240273
  18. Anastassiou CA, Montgomery SM, Barahona M, Buzsáki G, Koch C. The effect of spatially inhomogeneous extracellular electric fields on neurons. Journal of Neuroscience. 2010;30(5):1925–1936. pmid:20130201
  19. Shifman AR, Lewis JE. Elfenn: a generalized platform for modeling ephaptic coupling in spiking neuron models. Frontiers in Neuroinformatics. 2019;13:35. pmid:31214004
  20. Jefferys J. Nonsynaptic modulation of neuronal activity in the brain: electric currents and extracellular ions. Physiological reviews. 1995;75(4):689–723. pmid:7480159
  21. Weiss SA, Preuss T, Faber DS. A role of electrical inhibition in sensorimotor integration. Proceedings of the National Academy of Sciences. 2008;105(46):18047–18052.
  22. Han KS, Guo C, Chen CH, Witter L, Osorno T, Regehr WG. Ephaptic coupling promotes synchronous firing of cerebellar Purkinje cells. Neuron. 2018;100(3):564–578. pmid:30293822
  23. Schmidt H, Hahn G, Deco G, Knösche TR. Ephaptic coupling in white matter fibre bundles modulates axonal transmission delays. PLOS Computational Biology. 2021;17(2):e1007858. pmid:33556058
  24. Pinotsis DA, Miller EK. In vivo ephaptic coupling allows memory network formation. Cerebral Cortex. 2023;33(17):9877–9895. pmid:37420330
  25. Hunt T, Jones M. Fields or firings? Comparing the spike code and the electromagnetic field hypothesis. Frontiers in Psychology. 2023;14.
  26. Masuda N, Aihara K. Global and local synchrony of coupled neurons in small-world networks. Biological cybernetics. 2004;90(4):302–309. pmid:15085349
  27. Liu Y, Sun Z, Yang X, Xu W. Analysis of dynamical robustness of multilayer neuronal networks with inter-layer ephaptic coupling at different scales. Applied Mathematical Modelling. 2022;112:156–167.
  28. Guardiola X, Diaz-Guilera A, Llas M, Pérez C. Synchronization, diversity, and topology of networks of integrate and fire oscillators. Physical Review E. 2000;62(4):5565. pmid:11089114
  29. Sporns O. Structure and function of complex brain networks. Dialogues in clinical neuroscience. 2022.
  30. Bassett DS, Bullmore E. Small-world brain networks. The neuroscientist. 2006;12(6):512–523. pmid:17079517
  31. Costa M, Goldberger AL, Peng CK. Multiscale entropy analysis of complex physiologic time series. Physical review letters. 2002;89(6):068102. pmid:12190613
  32. Costa MD, Peng CK, Goldberger AL. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures. Cardiovascular Engineering. 2008;8:88–93. pmid:18172763
  33. Zhang YC. Complexity and 1/f noise. A phase space approach. Journal de Physique I. 1991;1(7):971–977.
  34. Cunha GM, Corso G, Miranda JGV, Dos Santos Lima GZ. Ephaptic entrainment in hybrid neuronal model. Scientific reports. 2022;12(1):1–10. pmid:35102158
  35. Cunha GM, Corso G, Lima MM, Dos Santos Lima GZ. Electrophysiological damage to neuronal membrane alters ephaptic entrainment. Scientific Reports. 2023;13(1):11974. pmid:37488148
  36. Holt GR, Koch C. Electrical interactions via the extracellular potential near cell bodies. Journal of computational neuroscience. 1999;6(2):169–184. pmid:10333161
  37. Logothetis NK, Kayser C, Oeltermann A. In vivo measurement of cortical impedance spectrum in monkeys: implications for signal propagation. Neuron. 2007;55(5):809–823. pmid:17785187
  38. Mechler F, Victor JD. Dipole characterization of single neurons from their extracellular action potentials. Journal of computational neuroscience. 2012;32:73–100. pmid:21667156
  39. Roth A, van Rossum MC, et al. Modeling synapses. Computational modeling methods for neuroscientists. 2009;6:139–160.
  40. Gerstner W, Kistler WM, Naud R, Paninski L. Neuronal dynamics: From single neurons to networks and models of cognition. Cambridge University Press; 2014.
  41. Watts DJ, Strogatz SH. Collective dynamics of ‘small-world’ networks. Nature. 1998;393(6684):440–442. pmid:9623998
  42. Izhikevich EM. Dynamical systems in neuroscience. MIT Press; 2007.
  43. Shannon CE. A mathematical theory of communication. The Bell system technical journal. 1948;27(3):379–423.
  44. Richman JS, Moorman JR. Physiological time-series analysis using approximate entropy and sample entropy. American journal of physiology-heart and circulatory physiology. 2000;278(6):H2039–H2049. pmid:10843903
  45. Humeau-Heurtier A. The multiscale entropy algorithm and its variants: A review. Entropy. 2015;17(5):3110–3123.
  46. Chenxi L, Chen Y, Li Y, Wang J, Liu T. Complexity analysis of brain activity in attention-deficit/hyperactivity disorder: A multiscale entropy analysis. Brain research bulletin. 2016;124:12–20. pmid:26995277
  47. Katz B, Schmitt OH. Electric interaction between two adjacent nerve fibers. Journal of Physiology. 1940;(97):471–488. pmid:16995178
  48. Arvanitaki A. Effects evoked in an axon by the activity of a contiguous one. Journal of Physiology. 1942; p. 91–108.
  49. Ryan TJ, Roy DS, Pignatelli M, Arons A, Tonegawa S. Engram cells retain memory under retrograde amnesia. Science. 2015;348(6238):1007–1013.
  50. Billings JC, Thompson GJ, Pan WJ, Magnuson ME, Medda A, Keilholz S. Disentangling multispectral functional connectivity with wavelets. Frontiers in Neuroscience. 2018;12:812. pmid:30459548
  51. Wang DJ, et al. Neurophysiological basis of multi-scale entropy of brain complexity and its relationship with functional connectivity. Frontiers in neuroscience. 2018; p. 352. pmid:29896081
  52. Van Horn JD, Jacokes Z, Newman B, Henry T. Is Now the Time for Foundational Theory of Brain Connectivity? Neuroinformatics. 2023; p. 1–3.
  53. Courtiol J, et al. The multiscale entropy: Guidelines for use and interpretation in brain signal analysis. Journal of neuroscience methods. 2016;273:175–190. pmid:27639660
  54. Berridge MJ. Calcium regulation of neural rhythms, memory and Alzheimer’s disease. The Journal of physiology. 2014;592(2):281–293. pmid:23753528
  55. Palva S, Palva JM. Functional roles of alpha-band phase synchronization in local and large-scale cortical networks. Frontiers in psychology. 2011;2:204. pmid:21922012
  56. Wang XJ. Neurophysiological and computational principles of cortical rhythms in cognition. Physiological reviews. 2010;90(3):1195–1268. pmid:20664082
  57. Fell J, Axmacher N. The role of phase synchronization in memory processes. Nature reviews neuroscience. 2011;12(2):105–118. pmid:21248789
  58. dos Santos Lima GZ, et al. Hippocampal and cortical communication around micro-arousals in slow-wave sleep. Scientific Reports. 2019;9(1):5876. pmid:30971751
  59. Song S, Sjöström PJ, Reigl M, Nelson S, Chklovskii DB. Highly nonrandom features of synaptic connectivity in local cortical circuits. PLoS biology. 2005;3:e68. pmid:15737062
  60. Tononi G, Sporns O, Edelman GM. A measure for brain complexity: relating functional segregation and integration in the nervous system. Proceedings of the National Academy of Sciences. 1994;91:5033–5037.
  61. Sporns O, Chialvo DR, Kaiser M, Hilgetag CC. Organization, development and function of complex brain networks. Trends in cognitive sciences. 2004;8(9):418–425. pmid:15350243
  62. Buzsáki G, Draguhn A. Neuronal oscillations in cortical networks. Science. 2004;304:1926–1929. pmid:15218136
  63. Silva Ferré IB, Corso G, dos Santos Lima GZ, et al. Cycling reduces the entropy of neuronal activity in the human adult cortex. bioRxiv. 2024. https://doi.org/10.1101/2024.01.31.578253
  64. Corso G, Prado TL, dos Santos Lima GZ, Kurths J, Lopes SR. Quantifying entropy using recurrence matrix microstates. Chaos: An Interdisciplinary Journal of Nonlinear Science. 2018;28(8). pmid:30180629
  65. Prado TL, et al. Maximum entropy principle in recurrence plot analysis on stochastic and chaotic systems. Chaos: An Interdisciplinary Journal of Nonlinear Science. 2020;30(4). pmid:32357677
  66. Madadi Asl M, Vahabie AH, Valizadeh A, Tass PA. Spike-timing-dependent plasticity mediated by dopamine and its role in Parkinson’s disease pathophysiology. Frontiers in Network Physiology. 2022;2:817524. pmid:36926058
  67. Gerstner W, Kempter R, Van Hemmen JL, Wagner H. A neuronal learning rule for sub-millisecond temporal coding. Nature. 1996;383(6595):76–78. pmid:8779718
  68. Hammond C, Bergman H, Brown P. Pathological synchronization in Parkinson’s disease: networks, models and treatments. Trends in neurosciences. 2007;30(7):357–364. pmid:17532060
  69. Dos Santos Lima GZ, et al. Disruption of neocortical synchronisation during slow-wave sleep in the rotenone model of Parkinson’s disease. Journal of Sleep Research. 2021;30(3):e13170. pmid:32865294
  70. Lima MMS, Targa ADS, Dos Santos Lima GZ, Cavarsan CF, Torterolo P. Macro and micro-sleep dysfunctions as translational biomarkers for Parkinson’s. International Review of Neurobiology. 2024;174:187–209.
  71. Belykh I, De Lange E, Hasler M. Synchronization of bursting neurons: What matters in the network topology. Physical review letters. 2005;94(18):188101. pmid:15904412
  72. Nobukawa S, Nishimura H, Yamanishi T. Temporal-specific complexity of spiking patterns in spontaneous activity induced by a dual complex network structure. Scientific Reports. 2019;9(1):12749. pmid:31484990
  73. Qiu C, Shivacharan RS, Zhang M, Durand DM. Can neural activity propagate by endogenous electrical field? Journal of Neuroscience. 2015;35(48):15800–15811. pmid:26631463
  74. de Sousa MPB, Cunha GM, Corso G, dos Santos Lima GZ. Thermal effects and ephaptic entrainment in Hodgkin–Huxley model. Scientific Reports. 2024;14(1):20075. pmid:39209942
  75. dos Santos Lima GZ, Cunha GM. How ephapticity can explain brain complexity? A brief report. academia.edu; 2024.